The Alpha-Beta-Symmetric Divergence and Its Positive Definite Kernel

Authors

  • Mactar Ndaw
  • Macoumba Ndour
  • Papa Ngom
Abstract

In this article we study Hilbertian metrics and positive definite (pd) kernels on probability measures, which are of real interest in kernel methods. First, starting from the Alpha-Beta-divergence, we obtain a Hilbertian metric by constructing a symmetric variant of this divergence, the Alpha-Beta-Symmetric divergence (ABS-divergence), study its properties, and propose the kernels associated with it. Second, we carry out numerical studies incorporating all proposed metrics/kernels into support vector machines (SVM). Finally, we present an algorithm for image classification based on our divergence.
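For orientation, the directed Alpha-Beta-divergence between discrete probability measures, in its standard parameterization, is recalled below; the averaged symmetrization shown with it is only one natural construction and may differ from the ABS-divergence defined in the full text.

```latex
% Directed AB-divergence between discrete measures p, q
% (valid for \alpha \neq 0, \beta \neq 0, \alpha + \beta \neq 0).
D_{AB}^{(\alpha,\beta)}(p \,\|\, q)
  = -\frac{1}{\alpha\beta}\sum_{i}\Big( p_i^{\alpha} q_i^{\beta}
      - \frac{\alpha}{\alpha+\beta}\, p_i^{\alpha+\beta}
      - \frac{\beta}{\alpha+\beta}\, q_i^{\alpha+\beta} \Big)

% One natural symmetrization (an assumption, not necessarily the ABS form):
D_{ABS}^{(\alpha,\beta)}(p, q)
  = \tfrac{1}{2}\Big( D_{AB}^{(\alpha,\beta)}(p \,\|\, q)
      + D_{AB}^{(\alpha,\beta)}(q \,\|\, p) \Big)
```

The following minimal sketch shows how a divergence of this kind can be fed to an SVM through a precomputed Gram matrix, in the spirit of the numerical studies announced above. The exponential kernel form k(p, q) = exp(-gamma * D), the helper names ab_divergence, abs_divergence and abs_kernel, and the parameter defaults are all illustrative assumptions, not the authors' construction; whether the resulting kernel is positive definite depends on the divergence being Hilbertian, which is what the paper establishes for its ABS-divergence.

```python
# Hedged sketch: a divergence-induced kernel plugged into scikit-learn's SVC
# via a precomputed Gram matrix. Names and parameters are illustrative.
import numpy as np
from sklearn.svm import SVC

def ab_divergence(p, q, alpha=0.5, beta=0.5):
    """Directed AB-divergence between probability vectors (alpha, beta, alpha+beta != 0)."""
    s = alpha + beta
    term = p**alpha * q**beta - (alpha / s) * p**s - (beta / s) * q**s
    return -np.sum(term) / (alpha * beta)

def abs_divergence(p, q, alpha=0.5, beta=0.5):
    """Averaged symmetrization of the directed divergence."""
    return 0.5 * (ab_divergence(p, q, alpha, beta) + ab_divergence(q, p, alpha, beta))

def abs_kernel(X, Y, gamma=1.0, alpha=0.5, beta=0.5):
    """Gram matrix of the exponential kernel k(p, q) = exp(-gamma * D_ABS(p, q))."""
    D = np.array([[abs_divergence(p, q, alpha, beta) for q in Y] for p in X])
    return np.exp(-gamma * D)

# Toy usage: each row is a histogram (a discrete probability vector).
rng = np.random.default_rng(0)
X = rng.dirichlet(np.ones(8), size=40)
y = (X[:, 0] > X[:, 1]).astype(int)

clf = SVC(kernel="precomputed")
clf.fit(abs_kernel(X, X), y)
print(clf.score(abs_kernel(X, X), y))
```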

Similar Articles

Infinite-dimensional Log-Determinant divergences II: Alpha-Beta divergences

This work presents a parametrized family of divergences, namely Alpha-Beta Log-Determinant (Log-Det) divergences, between positive definite unitized trace class operators on a Hilbert space. This is a generalization of the Alpha-Beta Log-Determinant divergences between symmetric, positive definite matrices to the infinite-dimensional setting. The family of Alpha-Beta Log-Det divergences is highl...
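For orientation, the finite-dimensional member of this family, between symmetric positive definite matrices P and Q, is commonly written as below (one standard parameterization, restated here as context; the work above extends it to unitized trace class operators on a Hilbert space):

```latex
% Finite-dimensional AB log-det divergence between SPD matrices P, Q
% (\alpha \neq 0, \beta \neq 0, \alpha + \beta \neq 0).
D_{AB}^{(\alpha,\beta)}(P \,\|\, Q)
  = \frac{1}{\alpha\beta}
    \log\det\!\left(
      \frac{\alpha\,(PQ^{-1})^{\beta} + \beta\,(PQ^{-1})^{-\alpha}}{\alpha+\beta}
    \right)
```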

Log-Determinant Divergences Revisited: Alpha-Beta and Gamma Log-Det Divergences

In this paper, we review and extend a family of log-det divergences for symmetric positive definite (SPD) matrices and discuss their fundamental properties. We show how to generate, from the parameterized Alpha-Beta (AB) and Gamma log-det divergences, many well-known divergences, for example Stein's loss, the S-divergence (also called the Jensen-Bregman LogDet (JBLD) divergence), the Logdet Zero (Bhattac...
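The specializations mentioned in this abstract arise as particular values or limits of the (alpha, beta) parameters; a few standard ones are restated below for context (with n the matrix dimension), not taken from the truncated abstract itself:

```latex
% Selected special cases of D_{AB}^{(\alpha,\beta)} (standard results):
\lim_{\alpha,\beta \to 0} D_{AB}^{(\alpha,\beta)}(P \,\|\, Q)
  = \tfrac{1}{2}\,\bigl\lVert \log\bigl(Q^{-1/2} P Q^{-1/2}\bigr) \bigr\rVert_F^2
  \quad \text{(half the squared affine-invariant Riemannian distance)}

\lim_{\alpha \to 0} D_{AB}^{(\alpha,1)}(P \,\|\, Q)
  = \operatorname{tr}\bigl(PQ^{-1}\bigr) - \log\det\bigl(PQ^{-1}\bigr) - n
  \quad \text{(Stein's loss)}

D_{AB}^{(1/2,1/2)}(P \,\|\, Q)
  = 4\Bigl(\log\det\tfrac{P+Q}{2} - \tfrac{1}{2}\log\det(PQ)\Bigr)
  \quad \text{(four times the S-divergence / JBLD)}
```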

Learning Discriminative Alpha-Beta-divergence for Positive Definite Matrices (Extended Version)

Symmetric positive definite (SPD) matrices are useful for capturing second-order statistics of visual data. To compare two SPD matrices, several measures are available, such as the affine-invariant Riemannian metric, Jeffreys divergence, Jensen-Bregman logdet divergence, etc.; however, their behaviors may be application dependent, raising the need for manual selection to achieve the best possibl...
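For concreteness, the comparison measures named in this abstract have the following standard closed forms for SPD matrices P and Q (restated from the literature, with n the matrix dimension; factor conventions vary slightly between authors):

```latex
% Affine-invariant Riemannian metric:
d_{AIRM}(P, Q) = \bigl\lVert \log\bigl(Q^{-1/2} P Q^{-1/2}\bigr) \bigr\rVert_F

% Jeffreys (symmetrized KL) divergence for zero-mean Gaussians:
D_J(P, Q) = \tfrac{1}{2}\operatorname{tr}\bigl(P^{-1}Q + Q^{-1}P\bigr) - n

% Jensen-Bregman log-det (JBLD) divergence, a.k.a. S-divergence:
D_S(P, Q) = \log\det\frac{P+Q}{2} - \tfrac{1}{2}\log\det(PQ)
```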

Optimization of Alpha-Beta Log-Det Divergences and their Application in the Spatial Filtering of Two Class Motor Imagery Movements

The Alpha-Beta Log-Det divergences for positive definite matrices are flexible divergences that are parameterized by two real constants and are able to specialize several relevant classical cases like the squared Riemannian metric, Stein's loss, the S-divergence, etc. A novel classification criterion based on these divergences is optimized to address the problem of classification of the moto...
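As a minimal, hedged illustration of divergence-based classification of SPD matrices: the sketch below applies a plain nearest-class-mean rule under the AB log-det divergence. It is not the optimized spatial-filtering criterion of the paper, and the function names and parameter defaults are illustrative assumptions.

```python
# Minimal sketch: nearest-class-mean classification of SPD covariance
# matrices under the AB log-det divergence. NOT the paper's optimized
# criterion; names and parameter choices are illustrative.
import numpy as np

def ab_logdet(P, Q, alpha=0.5, beta=0.5):
    """AB log-det divergence via the eigenvalues of Q^{-1/2} P Q^{-1/2}."""
    w, V = np.linalg.eigh(Q)
    Q_isqrt = (V * w**-0.5) @ V.T                      # Q^{-1/2}
    lam = np.linalg.eigvalsh(Q_isqrt @ P @ Q_isqrt)    # eigenvalues of P Q^{-1}
    s = alpha + beta
    return np.sum(np.log((alpha * lam**beta + beta * lam**-alpha) / s)) / (alpha * beta)

def classify(C, class_means, alpha=0.5, beta=0.5):
    """Assign covariance matrix C to the class whose mean is nearest in divergence."""
    d = [ab_logdet(C, M, alpha, beta) for M in class_means]
    return int(np.argmin(d))

# Toy usage with two synthetic 4x4 SPD class means.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)); M0 = A @ A.T + 4 * np.eye(4)
B = rng.standard_normal((4, 4)); M1 = B @ B.T + np.eye(4)
print(classify(M0 + 0.01 * np.eye(4), [M0, M1]))  # expected: 0
```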

Sparse Coding and Dictionary Learning for Symmetric Positive Definite Matrices: A Kernel Approach

Recent advances suggest that a wide range of computer vision problems can be addressed more appropriately by considering non-Euclidean geometry. This paper tackles the problem of sparse coding and dictionary learning in the space of symmetric positive definite matrices, which form a Riemannian manifold. With the aid of the recently introduced Stein kernel (related to a symmetric version of Breg...
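For reference, the Stein kernel mentioned here is usually built from the S-divergence by exponentiation, as shown below. A known result from the S-divergence literature (restated as context, not taken from the truncated abstract above) is that this kernel is positive definite only for particular values of the scale parameter: the half-integers 1/2, 1, ..., (n-1)/2 and every gamma > (n-1)/2, for n x n matrices.

```latex
% Stein kernel from the S-divergence (definition restated from the literature):
k(P, Q) = \exp\bigl(-\gamma\, D_S(P, Q)\bigr),
\qquad
D_S(P, Q) = \log\det\frac{P+Q}{2} - \tfrac{1}{2}\log\det(PQ)
```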


Journal:

Volume   Issue

Pages  -

Publication date: 2018